Search Results: "jason"

18 January 2016

James McCoy: neovim-coming-to-debian

Almost 9 months after I took ownership of the Neovim RFP, I finally tagged & uploaded Neovim to Debian. It still has to go through the NEW queue, but it will soon be in an experimental release near you. I'm holding off on uploading it to unstable for the time being for a couple of reasons. Many thanks to Jason Pleau for working on getting the parts of the Lua stack needed to support Neovim into Debian.

1 April 2015

Joey Hess: I am ArchiveTeam

This seems as good a day as any to mention that I am a founding member of ArchiveTeam. Way back, when Geocities was closing down, I was one of a small rag-tag group who saved a copy of most of it. That snapshot has since generated more publicity than most other projects I've worked on. I've heard many heartwarming stories of it being the only remaining copy of baby pictures and writings of deceased friends, and so on. It's even been the subject of serious academic study as outlined in this talk, which is pretty awesome. I'm happy to let Jason Scott be the public face of ArchiveTeam in internet meme-land. It's a 0.1% project for me, and has grown into a well-oiled machine, albeit one that shouldn't need to exist. I only get involved these days when there's another crazy internet silo fire drill and/or I'm bored. (Rumors of me being the hand model for ArchiveTeam are, however, unsubstantiated.)

23 July 2014

Andrew Pollock: [tech] Going solar

With electricity prices in Australia seeming to be only going up, and solar being surprisingly cheap, I decided it was a no-brainer to invest in a solar installation to reduce my ongoing electricity bills. It also paves the way for getting an electric car in the future. I'm also a greenie, so having some renewable energy happening gives me the warm and fuzzies. So today I got solar installed. I've gone for a 2 kW system, consisting of eight 250 watt Seraphim panels (I'm not entirely sure which model) and an Aurora UNO-2.0-I-OUTD inverter. It was totally a case of decision fatigue when it came to shopping around. Everyone claims the particular panels they want to sell are the best, and it's pretty much impossible to make a decent assessment of their claims. In the end, I went with the Seraphim panels because they scored well on the PHOTON tests, and because they're apparently one of the few panels that pass the Thresher test, which tests for durability. That said, I've had other solar companies tell me the PHOTON tests aren't indicative of Australian conditions; it's hard to know who to believe. The harder choice was the inverter. I'm told that yield varies wildly by inverter, and I narrowed it down to Aurora or SunnyBoy. Jason's got a SunnyBoy, and the appeal with it was that it supported Bluetooth for data gathering, although I don't much care for its aesthetics. Then I learned that there was a WiFi card coming out soon for the Aurora inverter, and that struck me as better than Bluetooth, so I went with the Aurora. I discovered at the eleventh hour that the model of Aurora inverter that was going to be supplied wasn't supported by the WiFi card, but I was able to switch to the model that was. I'm glad I did, because the newer model looks really nice on the wall. The whole system was up and running just in time to catch the setting sun, so I'm looking forward to seeing it in action tomorrow.
Apparently the next step is for Energex to come out and replace my analog power meter with a digital one. I'm grateful that I was able to get Body Corporate approval to use some of the roof. Being on the top floor helped make the installation more feasible too, I think.
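As a sanity check, the nameplate numbers multiply out as expected. The rough daily-yield figure below assumes about 4.2 average peak sun hours for Brisbane and a 0.8 derating factor for inverter and wiring losses; both figures are illustrative assumptions on my part, not numbers from the installer:

```python
# Nameplate capacity of the array: eight panels at 250 W each.
panels = 8
watts_per_panel = 250
capacity_kw = panels * watts_per_panel / 1000
print(capacity_kw)  # prints 2.0 (kW)

# Rough daily yield estimate. Peak sun hours and the derating
# factor are illustrative assumptions, not measured values.
peak_sun_hours = 4.2
derate = 0.8
daily_kwh = capacity_kw * peak_sun_hours * derate
print(round(daily_kwh, 1))  # prints 6.7 (kWh/day)
```

So a 2 kW array of this sort might plausibly generate somewhere in the region of 6-7 kWh on a decent day, which is the kind of number the digital meter should let me verify once it's in.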

19 June 2014

Joachim Breitner: Another instance of Haskell Bytes

When I gave my Haskell Bytes talk on the runtime representation of Haskell values the first time, I wrote here: "It is in German, so [..] if you want me to translate it, then (convince your professor or employer to) invite me to hold the talk again." This has just happened: I got to hold the talk as a Tech Talk at Galois in Portland, so now you can fetch the text in English as well. Thanks to Jason for inviting me!
This was on my way to the Oregon Summer School on Programming Languages in Eugene, where I'm right now enjoying the shade of a tree next to the campus. We've got a relatively packed program with lectures on dependent types, categorical logic and other stuff, and more student talks in the evening (which unfortunately always collide with the open board game evenings at the local board game store). So at least we started to have a round of Diplomacy, where I am about to be crushed from four sides at once. (And no, I don't think that this has triggered the illegal download warning that the University of Oregon received about our internet use and that threatens our internet connectivity.)

2 June 2014

Bálint Réczey: I Can Hear Music again (thanks to forked-daapd/Debian)

When I started looking for a lightweight solution for serving a music library over the LAN, I did not expect so many complications. I didn't think it would be a unique need to have something running on a SheevaPlug straight from the Debian repository. Apparently it kind of was. Debian used to have mt-daapd (popcon: 165), but now it is available from oldstable only and upstream is dead. There is tangerine (popcon: 98) with its Mono dependencies and GUI, which seemed to me overkill and more like a demo of a networked application written in Mono than a music library server. The most promising candidate was forked-daapd (popcon: 220), but it was far from being a true winner. First, it had a series of dead upstreams. At the beginning it was forked from mt-daapd (hence the name) by Julien Blache, who also served as the prior Debian maintainer. Then the code base was forked and converted to use Grand Central Dispatch. Then the GCD fork died off slowly as well a few years ago. When I found the package it had been unmaintained for a few years and was based on the GCD branch, which prevented building it on many architectures, and the server itself was crashing or quitting occasionally. Luckily there still existed a fork, thanks to Espen Jørgensen, which was well maintained and could serve as a way out, but on examining it closely it turned out that it had switched from GCD to libevent, but to a version (1.4) which is present only in oldstable! And some say Debian's software versions are ancient ;-). Moreover it was not simply libevent 1.4-based, but it included some heavily patched parts of it. Espen also liked the idea of packaging his version in Debian, so we extracted the patches to libevent and slowly got them accepted into libevent's master. Forked-daapd's master works best with libevent 2.1.4-alpha, but thanks to Espen the development branch now also works with libevent 2.0.x, giving up some performance and a little feature.
This was a long journey, but finally Espen's forked-daapd became ready to be used as the new upstream of the Debian package, so please welcome 20.0+git20140530+gc740e6e-1, the first version of forked-daapd building on all architectures in a very long time and a prime candidate for being the music library server in Jessie (and wheezy-backports, soon)! Testing and bug reports are always welcome! From the package description:
 forked-daapd is an iTunes-compatible media server, originally intended
 as a rewrite of Firefly Media Server (also known as mt-daapd).
 It supports a wide range of audio formats, can stream video to iTunes,
 FrontRow and other compatible clients, has support for Apple's Remote
 iPhone/iPod application and can stream music to AirTunes devices like
 the AirPort Express.
 It also features RSP support for Roku's SoundBridge devices.
 Built-in, on-the-fly decoding support enables serving popular free music
 formats like FLAC, Ogg Vorbis or Musepack to those clients that do not
 otherwise support them.

17 April 2014

Andrew Pollock: [life] Day 79: Magic, flu shots, and play dates and dinner

Zoe slept until 7:45am this morning, which is absolutely unheard of in our house. She did wake up at about 5:15am yelling out for me because she'd kicked her doona off and lost Cowie, but went back to sleep once I sorted that out. She was super grumpy when she woke up, which I mostly attributed to being hungry, so I got breakfast into her as quickly as possible and she perked up afterwards. Today there was a free magic show at the Bulimba Library at 10:30am, so we biked down there. I really need to work on curbing Zoe's procrastination. We started trying to leave the house at 10am, and as it was, we only got there with 2 minutes to spare before the show started. Magic Glen put on a very entertaining show: part comedian, part sleight-of-hand magician, with plenty of gags in it for the adults. Zoe started out sitting in my lap, but part way through just got up and moved closer to the front to sit with the other kids. I think she enjoyed herself. I'd have no hesitation hiring this guy for a future birthday party. Zoe had left her two stuffed toys from the car at Megan's house on Tuesday after our Port of Brisbane tour, so after the magic show we biked to her place to retrieve them. It was close to lunch by this stage, so we stayed for lunch, and the girls had a bit of a play in the back yard while Megan's little sister napped. It was getting close to time to leave for our flu shots, so I decided to just bike directly to the doctor from Megan's place. I realised after we left that we'd still left the stuffed toys behind, but the plan was to drive back after our flu shots and have another swim in their neighbour's pool, so it was all good. We got to the doctor and waited for Sarah to arrive. Sarah and I weren't existing patients at Zoe's doctor, but we'd decided to get the flu shot as a family to try and ease the experience for Zoe.
We both had to do new patient intake stuff before we had a consult with Zoe's doctor and got prescriptions for the flu shot. I popped next door to the adjacent pharmacy to get the prescriptions filled, and then the nurse gave us the shots. For the last round of vaccinations that Zoe received, she needed three, and she screamed the building down at the first jab. The poor nurse was very shaken, so we've been working to try and get her to feel more at ease about this one. Zoe went first: she took a deep breath and was winding up to freak out when she had her shot, but then it was all over, and she let the breath go and looked around with a kind of "is that it?" reaction. She didn't even cry. I was so proud of her. I got my shot, and then Sarah got hers, and we had to sit in the waiting room for 10 minutes to make sure we didn't turn into pumpkins, and we were on our way. We biked home, I grabbed our swim gear, and we drove back to Megan's place. The pool ended up being quite cold. Megan didn't want to get in, and Zoe didn't last long either. Megan's Mum was working back late, so I invited Megan, her Dad and her sister over for dinner, and we headed home so I could prepare it. One of Zoe's stuffed toys had been located. We had a nice dinner of deviled sausages made in the Thermomix, and for a change I didn't have a ton of leftovers. Jason had found the other stuffed toy in his truck, so we'd finally tracked them both down. After Megan and family went home, I got Zoe to bed without much fuss, and pretty much on time. I think she should sleep well tonight.

15 April 2014

Andrew Pollock: [life] Day 77: Port of Brisbane tour

Sarah dropped Zoe around this morning at about 8:30am. She was still a bit feverish, but otherwise in good spirits, so I decided to stick with my plan for today, which was a tour of the Port of Brisbane. Originally the plan had been to do it with Megan and her Dad, Jason, but Jason had some work to do on his house, so I offered to take Megan with us to allow him more time to work on the house uninterrupted. I was casting around for something to do to pass the time until Jason dropped Megan off at 10:30am, and I thought we could do some foot painting. We searched high and low for something I could use as a foot washing bucket, other than the mop bucket, which I didn't want to use because of potential chemical residue. I gave up because I couldn't find anything suitable, and we watched a bit of TV instead. Jason dropped Megan around, and we immediately jumped in the car and headed out to the Port. I missed the on-ramp for the M4 from Lytton Road, and so we took the slightly longer Lytton Road route, which was fine, because we had plenty of time to kill. The plan was to get there for about 11:30am, have lunch in the observation cafe on the top floor of the visitor's centre building, and then get on the tour bus at 12:30pm. We ended up arriving much earlier than 11:30am, so we looked around the foyer of the visitor's centre for a bit. It was quite a nice building. The foyer area had some displays, but the most interesting thing (for the girls) was an interactive webcam of the shore bird roost across the street. There was a tablet where you could control the camera and zoom in and out on the birds roosting on a man-made island. That passed the time nicely. One of the staff also gave the girls Easter eggs as we arrived. We went up to the cafe for lunch next. The view was quite good from the 7th floor. On one side you could look out over the bay, notably Saint Helena Island, and on the other side you got quite a good view of the port operations and the container park.
Lunch didn't take all that long, and the girls were getting a bit rowdy, running around the cafe, so we headed back downstairs to kill some more time looking at the shore birds with the webcam, and then we boarded the bus. It was just the three of us and three other adults, which was good. The girls were pretty fidgety, and I don't think they got that much out of it. The tour didn't really go anywhere that you couldn't go yourself in your own car, but you did get running commentary from the driver, which made all the difference. The girls spent the first 5 minutes trying to figure out where his voice was coming from (he was wired up with a microphone). The thing I found most interesting about the port operations was the amount of automation. There were three container terminals, and the two operated by DP World and Hutchinson Ports employed fully automated overhead cranes for moving containers around. Completely unmanned, they'd go pick a container from the stack and place it on a waiting truck below. What I found even more fascinating was the Patrick terminal, which used fully automated straddle carriers that would, completely autonomously, move about the container park, pick up a container, and then move over to a waiting truck in the loading area and place it on the truck. There were 27 of these things moving around the container park at a fairly decent clip. Of course the girls didn't really appreciate any of this, and halfway through the tour Megan was busting to go to the toilet, despite going before we started the tour. I was worried about her having an accident before we got back, but she didn't, so it was all good. I'd say in terms of a successful excursion, I'd score it about a 4 out of 10, because the girls didn't really enjoy the bus tour all that much. I was hoping we'd see more ships, but there weren't many (if any) in port today. They did enjoy the overall outing. Megan spontaneously thanked me as we were leaving, which was sweet.
We picked up the blank cake I'd ordered from Woolworths on the way through on the way home, and then dropped Megan off. Zoe wanted to play, so we hung around for a little while before returning home. Zoe watched a bit more TV while we waited for Sarah to pick her up. Her fever picked up a bit more in the afternoon, but she was still very perky.

10 April 2014

Andrew Pollock: [life] Day 72: The Workshops, and zip lining into a pool

Today was jam packed, from the time Zoe got dropped off to the time she was picked up again. I woke up early to go to my yoga class. It had moved from 6:15am to 6:00am, but was closer to home. I woke up a bunch of times overnight because I wanted to make sure I got up a little bit earlier (even though I had an alarm set) so I was a bit tired. Sarah dropped Zoe off, and we quickly inspected our plaster fish from yesterday. Because the plaster had gotten fairly thick, it didn't end up filling the molds completely, so the fish weren't smooth. Zoe was thrilled with them nonetheless, and wanted to draw all over them. After that, we jumped in the car to head out to The Workshops Rail Museum. We were meeting Megan there. We arrived slightly after opening time. I bought an annual membership last time we were there, and I'm glad we did. The place is pretty good. It's all indoors, and it's only lightly patronised, even for school holidays, so it was nice and quiet. Megan and her Dad and sister arrived about an hour later, which was good, because it gave Zoe and I a bit of time to ourselves. We had plenty of time on the diesel engine simulator without anyone else breathing down our neck wanting a turn. The girls all had a good time. We lost Megan and Zoe for a little bit when they decided to take off and look at some trains on their own. Jason and I were frantically searching the place before I found them. There was a puppet show at 11am, and the room it was in was packed, so we plonked all three kids down on the floor near the stage, and waited outside. That was really nice, because the kids were all totally engrossed, and didn't miss us at all. After lunch and a miniature train ride we headed home. Surprisingly, Zoe didn't nap on the way home. Jason was house sitting for some of his neighbours down the street, and he'd invited us to come over and use their pool, so we went around there once we got back home. The house was great. They also had a couple of chickens. 
The pool was really well set up. It had a zip line that ran the length of the pool. Zoe was keen to give it a try, and she did really well, hanging on all the way. They also had a little plastic fort with a slippery slide that could be placed at the end of the pool, and the girls had a great time sliding into the pool that way. We got back home from all of that fun and games about 15 minutes before Sarah arrived to pick Zoe up, so it was a really non-stop day.

7 April 2014

Andrew Pollock: [life] Day 69: Walk to King Island, a picnic at Wellington Point, the long slow acquisition of some linseed and a split lip

Today was a really good day, right up until the end, when it wasn't so good, but it could have been a whole lot worse, so I'm grateful for that. I've been wanting to walk out to King Island at low tide with Zoe for a while, but it's taken about a month to get the right combination of availability, weather and low tide timing to make it possible. Today, there was a low tide at about 10:27am, which I thought would work out pretty well. I wasn't sure if the tide needed to be dead low to get to King Island, so I thought we could get there a bit early and possibly follow the tide out. I invited Megan and Jason to join us for the day and make a picnic of it. It turned out that we didn't need a really low tide; the sand bar connecting King Island to Wellington Point was well and truly accessible well before low tide was reached, so we headed out as soon as we arrived. I'd brought Zoe's water shoes, but from looking at it, thought it would be walkable in bare feet. We got about 10 metres out on the sand and Zoe started freaking out about crabs. I think that incident with the mud crab on Coochiemudlo Island has left her slightly phobic of crabs. So I went back to Jason's car and got her water shoes. I tried to allay her fears a bit by sticking my finger in some of the small holes in the sand, and even got her to do it too. I'm actually glad that I did get her water shoes, because the shell grit got a bit sharp and spiky towards King Island, so I probably would have needed to carry her more than I did otherwise. Along the way to the island we spotted a tiny baby mud crab, and Zoe was brave enough to hold it briefly, so that was good. We walked all the way out and partially around the island and then across it before heading back. The walk back was much slower because there was a massive headwind. Zoe ran out of steam about half way back.
She didn't like the sand getting whipped up and stinging her legs, and the wind was forcing the brim of her hat down, so I gave her a ride on my shoulders for the rest of the way back. We had some lunch after we got back to Wellington Point, and Zoe found her second wind chasing seagulls around the picnic area. After an ice cream, we went over to the playground and the girls had a great time playing. It was a pretty good park. There was this huge tree with a really big, thick, horizontal branch only about a metre or two off the ground. All the kids were climbing on it and then shimmying along the branch to the trunk. Zoe's had a few climbs in trees and seems not afraid of it, so she got up and had a go. She did really well and did a combination of scooting along, straddling the branch and doing a Brazilian Jiu-Jitsu-style "bear crawl" along the branch. It was funny seeing different kids' limits. Zoe was totally unfazed by climbing the tree. Megan was totally freaking out. But when it came to walking in bare feet in an inch of sea water, Zoe wanted to climb up my leg like a rat up a rope, in case there were crabs. Each to their own. Zoe wanted to have a swim in the ocean, so I put her into her swimsuit, but had left the water shoes back in the car. Once again, she freaked out about crabs as soon as we got ankle deep in the water, and was freaking out Megan as well, so the girls elected to go back to playing in the park. After a good play in the park, we headed back home. We'd carpooled in Jason's truck, with both girls in the back. I'd half expected Zoe to fall asleep on the way back, but the girls were very hyped up and had a great time playing games and generally being silly in the back. When we got back to our place, Jason was in need of a coffee, so we walked to the Hawthorne Garage and had coffee and babyccinos, before Megan and Jason went home. It was about 3:30pm at this point, and I wanted to make a start on dinner. 
I was making a wholemeal pumpkin quiche, which I've made a few times before, and I discovered we were low on linseed. I thought I'd push things and see if Zoe was up for a scooter ride to the health food shop to get some more and kill some time. She was up for it, but ran out of steam part way across Hawthorne Park. Fortunately she was okay with walking and didn't want me to carry her and the scooter. It took us about an hour to get to the health food shop. Zoe immediately remembered the place from the previous week where we'd had to stop for a post-meltdown pit stop and declared she needed to go to the toilet again. We finally made it out of the shop. I wasn't looking forward to the long walk back home, but there were a few people waiting for a bus at the bus stop near the health food shop, and on checking the timetable, the bus was due in a couple of minutes, so we just waited for the bus. That drastically shortened the trip back. Zoe managed to drop the container of linseed on the way home from the bus stop, but miraculously the way it landed didn't result in the loss of too much of the contents, it just split the container. So I carefully carried the container home the rest of the way. By this stage it was quite a bit later than I had really wanted to be starting dinner, but we got it made, and Zoe really liked the pumpkin quiche, and ate a pretty good dinner. It was after dinner when things took a turn for the worse. Zoe was eating an ice block for dessert, and for whatever reason, she'd decided to sit in the corner of the kitchen next to the dishwasher, while I was loading it. I was carrying over one of the plates, and the knife managed to fall off the plate, bounce off the open dishwasher door and hit her in the mouth, splitting her lip. Zoe was understandably upset, and I was appalled that the whole thing had happened. She never sits on the kitchen floor, let alone in the corner where the dishwasher is. And this knife came so close to her eye. 
Fortunately the lip didn't look too bad. It stopped bleeding quickly, and we kept some ice on it and the swelling went down. I hate it when accidents happen on my watch. I feel like I'm fighting the stigma of the incompetent single Dad, or the abusive single Dad, so when Zoe sustains an injury to the face like a fat lip, which could be misinterpreted, I, well, really hate it. This was such a freak accident, and it could have gone so much worse. I'm just so glad she's okay. Zoe recovered pretty well from it, and I was able to brush her teeth without aggravating her lip. She went to bed well, and I suspect she's going to sleep really well. It's a bit cooler tonight, so I'm half-expecting a sleep in in the morning with any luck.

3 April 2014

Andrew Pollock: [life] Day 65: Playgroup, and the foil confetti play date

Zoe slept pretty well last night. She only woke up briefly at 4am because Cowie had fallen out of bed and she couldn't find her. Today was the last Playgroup of the term. Megan, her little sister and her Dad came as well to check it out, which was nice, because Zoe then had someone to actively play with in addition to me. After Playgroup, we went to the adjacent Bulimba Memorial Park with Megan, and then had some lunch at Grill'd. Megan's Dad wanted to do some work on their house while Megan's little sister napped, so I offered to give Megan a play date at our place. The plan was to watch a movie and chill out. The girls picked Ratatouille and I made a batch of popcorn for them. Unfortunately Megan seemed to be less of a square eyes than Zoe, and she lost interest after a bit, so we stopped watching the movie and moved out to the balcony to do some craft. Zoe had been wanting to make a crown for Mummy's boss for a while, so we made a couple of crowns with the hot glue gun. I had bought this bag of mixed craft "jewels" and it's probably the best single craft thing I've bought. Zoe loves gluing them onto everything. After that, Zoe pulled out the bag of coloured foil confetti. If the gems were the best thing I've bought, this would have to be the worst. So far, all it's done is leak in the drawer it's been stored in, and I've been avoiding using it because it was going to be messy. Today, Zoe wanted to glue it onto the outside of her cardboard box, so I decided to give in and embrace the mess, and boy, did we make a mess. It probably ended up being the longest bit of cooperative play the girls did. They'd alternate between handing each other a fistful of confetti while I applied globs of glue where directed. Probably about 10 percent of each handful ended up stuck to the rocket, so the balcony looked like quite a mess by the end of it all, but at least it was a dry mess, so I could just vacuum it all up. 
I suspect I'll be encountering dregs for quite a while, because I doubt it's stuck to the cardboard particularly well. After that, the girls played indoors for a bit, and watched a bit more of the movie, but Megan seemed to be scared of Anton Ego, so I think that was why it wasn't holding her attention. The other activity that the girls seemed to thoroughly enjoy was tearing around the living room squealing while they took turns at throwing a grapefruit-sized beach ball at me, and I threw it back at them. Jason came back to pick up Megan, and I started dinner. Not that long after Megan left, Sarah arrived to watch Zoe for me so I could go visit my cousin in hospital. I had dinner on the table pretty much as soon as she walked in the door, and headed out.

18 January 2014

James Bromberger: Linux.conf.au 2014: LCA TV

The radio silence here on my blog has been not from lack of activity, but the inverse. Linux.conf.au chewed up the few remaining spare cycles I have had recently (after family and work), but not from organising the conference (been there, got the T-Shirt and the bag). So, let's do a run-through of what has happened. LCA2014 Perth has come and gone in pretty smooth fashion. A remarkable effort from the likes of the Perth crew of Luke, Paul, Euan, Leon, Jason, Michael, and a slew of volunteers who stepped up, not to mention our interstate friends Steve and Erin, Matthew, James I, Tim the streaming guy and others, and our pro organisers at Manhattan Events. It was a reasonably smooth ride: the UWA campus was beautiful, the lecture theatres were workable, and the Octagon Theatre was at its best when filled with just shy of 500 like-minded people and an accomplished person gracing the stage. What was impressive (to me, at least) was the effort of the AV team (which I was on the extreme edges of); videos of keynotes hit the Linux Australia mirror within hours of the event. Recording and live streaming of all keynotes and sessions happened almost flawlessly. Leon had built a reasonably robust video capture management system (eventstreamer on github) to ensure that people fresh to DVswitch had nothing break so badly it didn't automatically fix itself, and all of this was monitored from the Operations Room (called the TAVNOC, which would have been the AV NOC, but somehow a loose reference to the UWA Tavern, the Tav, crept in there). Some 167 videos were made and uploaded; most of these were also mirrored on campus before the end of the conference so attendees could load up their laptops with plenty of content for the return trip home. Euan's quick Blender work meant there was a nice intro and outro graphic, and Leon's scripting ensured that Zookeepr, the LCA conference management software, was the source of truth in getting all videos processed and tagged correctly.
I was scheduled (and did give) a presentation at LCA 2014 about Debian on Amazon Web Services (on Thursday), and attended as many of the sessions as possible, but my good friend Michael Davies (LCA 2004 chair, and chair of the LCA Papers Committee for a good many years) had another role for this year. We wanted to capture some of the hallway track of Linux.conf.au that is missed in all the videos of presentations. And thus was born LCA TV. LCA TV consisted of the video equipment for an additional stream: mixer host, cameras, cables and switches, hooking into the same streaming framework as the rest of the sessions. We took over a corner of the registration room (UWA Undercroft), brought in a few stage lights, a couch, coffee table, seat and some extra mics, and aimed to fill the session gaps with informal chats with some of the people at Linux.conf.au: speakers, attendees and volunteers alike. And come they did. One or two interviews didn't succeed (this was an experiment), but in the end, we've got over 20 interviews with some interesting people. These streamed out live to the people watching LCA from afar, those unable to make it to Perth in early January; but they were recorded too, and we can start to watch them (see below). I was also lucky enough to mix the video for the three keynotes as well as the opening and closing, with a very capable crew around the Octagon Theatre. As the curtain came down, and the 2014 crew took to the stage to be congratulated by the attendees, I couldn't help but feel a little bit proud and a touch nostalgic: memories from 11 years earlier, when LCA 2003 came to a close in the very same venue. So, before we head into the viewing season for LCA TV, let me thank all the volunteers who organised, the AV volunteers, the Registration volunteers, and the UWA team who helped with the Octagon, networking, and the awesome CB radios hooked up to the UWA repeater that worked all the way to the airport. Thanks to the speakers who submitted proposals.
The speakers who were accepted, made the journey and took to the stage. The people who attended. The sponsors who help make this happen. All of the above helps share the knowledge and, ultimately, move the community forward. But my thanks to Luke and Paul for agreeing to stand there in the middle of all this madness and hive of semi-structured activity that just worked. Please remember this was experimental; the noise was the buzz of the conference going on around us. There was pretty much only one person on the AV kit: my thanks to Andrew Cooks, whom I'll dub as our sound editor, vision director, floor manager, and anything else. So who did we interview? One or two interviews did not work, so apologies to those that are missing. Here's the playlist to start you off! Enjoy.

1 January 2014

Russ Allbery: 2013 Book Reading in Review

What a strange year. 2013 was marked by a whole sequence of entirely unexpected events, including multiple major work upheavals. For large chunks of the year, I had very little time or emotional energy for personal reading goals, and particularly for writing reviews. I declared personal amnesty on most of my intentions halfway through the year, and all the totals will reflect that. On the plus side (although not for reading and reviews), it was a great year for video games. Next year, there will be no specific goals. Between continuing work fallout, a very busy project schedule, my intent to keep playing a lot of video games, and various other personal goals I want to take on, I'm going to take the pressure off of reading. Things will be read and reviews will be written (and I'm going to make more of an effort to write reviews shortly after reading books), but I'm not going to worry about how many. The below statistics are confined to the books I reviewed in 2013. I read six more books that I've not yet reviewed, due to the chaos at the end of the year. Those will be counted in 2014. There were no 10 out of 10 books this year, partly due to the much lower reading totals and partly due to my tendency this year to turn to safe comfort reading, which is reliably good but unlikely to be exceptional. There were, however, several near-misses that were worth calling out. My favorite book of the year was Neal Stephenson's Anathem, which narrowly missed a 10 for me due to some fundamental problems with the plot premise. But this is still an excellent book: the best novel about the practice of science and philosophy that I've ever read. Also deserving mention are K.E. Lane's And Playing the Role of Herself, a lovely and intelligent lesbian romance that's likely to appeal even to people who would not normally try that genre, and Guy Gavriel Kay's River of Stars.
The latter isn't quite at the level of Kay's earlier Under Heaven, but it's still an excellent work of alternate historical fiction in a memorable setting. A special honorable mention goes to Lisa O'Donnell's The Death of Bees. It requires a lot of warnings for very dark subject matter and a rather abrupt ending, but it's been a long time since I've cared that much about the characters of a book. My favorite non-fiction book of the year was Gary J. Hudson's They Had to Go Out, a meticulously researched account of a tragic Coast Guard mission. The writing is choppy, the editing could have been better, and it's clear that the author is not a professional writer, but it's the sort of detailed non-fiction account that can only be written by someone who's been there and lived through similar experiences. Also worth mentioning is Mark Jason Dominus's Higher Order Perl, which was the best technical book I read all year and which I found quite inspiring for my own programming. The full analysis includes some additional personal reading statistics, probably only of interest to me.

27 December 2013

Russell Coker: Sound Device Order with ALSA

One problem I have had with my new Dell PowerEdge server/workstation [1] is that sound doesn't work correctly. When I initially installed it things were OK, but after installing a new monitor sound stopped working. The command aplay -l showed the following:
**** List of PLAYBACK Hardware Devices ****
card 0: Generic [HD-Audio Generic], device 3: HDMI 0 [HDMI 0]
Subdevices: 1/1
Subdevice #0: subdevice #0
card 1: Speaker [Logitech USB Speaker], device 0: USB Audio [USB Audio]
Subdevices: 1/1
Subdevice #0: subdevice #0

So the HDMI sound hardware (which had no speakers connected) became ALSA card 0 (the default playback device) and the USB speakers became card 1. It should be possible to configure KDE to use card 1 and then have other programs inherit this, but I wasn't able to configure that with Debian/Wheezy. My first attempt at solving this was to blacklist the HDMI and motherboard drivers (as suggested by Lindsay on the LUV mailing list). I added the following to /etc/modprobe.d/hdmi-blacklist.conf:
blacklist snd_hda_codec_hdmi
blacklist snd_hda_intel

Blacklisting the drivers works well enough. But the problem is that I will eventually want to install HDMI speakers to get better quality than the old Logitech portable USB speakers, and it would be convenient to have things just work. Jason White suggested using the module options to specify the ALSA card order. The file /etc/modprobe.d/alsa-base.conf in Debian comes with an entry specifying that the USB driver is never to be card 0, which is exactly what I don't want. So I commented out the previous option for snd-usb-audio and put in the following ones to replace it:
# make USB 0 and HDMI/Intel anything else
options snd-usb-audio index=0
options snd_hda_codec_hdmi index=-2
options snd_hda_intel index=-2

Now I get the following from aplay -l, and both KDE and mplayer will play to the desired card by default:
**** List of PLAYBACK Hardware Devices ****
card 0: Speaker [Logitech USB Speaker], device 0: USB Audio [USB Audio]
Subdevices: 1/1
Subdevice #0: subdevice #0
card 1: Generic [HD-Audio Generic], device 3: HDMI 0 [HDMI 0]
Subdevices: 1/1
Subdevice #0: subdevice #0
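As a quick sanity check that the card order stays as intended across reboots, the aplay -l output can also be parsed programmatically. A minimal sketch (my own helper, assuming the exact line format shown above):

```python
import re

def playback_card_order(aplay_output):
    """Extract (index, card name) pairs from `aplay -l` output lines."""
    cards = []
    for line in aplay_output.splitlines():
        # Lines describing a card look like: card 0: Speaker [Logitech USB Speaker], ...
        m = re.match(r"card (\d+): (\w+) \[([^\]]+)\]", line)
        if m:
            cards.append((int(m.group(1)), m.group(3)))
    return cards
```

Feeding it the output above should report the USB speakers first, confirming the module options took effect.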

8 September 2013

Joachim Breitner: You-Say-First has dice

A while ago I wrote a small web application called You-Say-First. It allows multiple users to join a chat room where they can, besides chatting, enter moves that are only shown to everyone once everyone has submitted one. This way you can play games like Rock-Paper-Scissors or Diplomacy online, even if you want to play by your own strange rules that none of the standard, game-specific web sites support. Or even make up many kinds of games on the spot!
I was approached by Jason Worley, who said he liked the site but would have a need for dice rolling to play the games he wants to play. So he gets dice rolling: just say, in the chat, what kind of dice you want, and the server will roll them for you and tell everyone about them. I tried to make it understand many variations of the command, so you can say "roll a die, please", "toss five coins" or "throw 3D20". If the server does not react, try modifying your command. As always, I'm happy about feedback, bug reports and patches, and I hope that it is a useful tool for you.
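The site's actual parser isn't shown, but the "3D20"-style part of the command handling could be sketched like this (the function, regex and grammar here are my own guesses for illustration, not the real implementation, and it only covers the NdM notation, not phrasings like "toss five coins"):

```python
import random
import re

def roll_dice(command, rng=None):
    """Parse a command like 'throw 3D20' or 'roll 2d6' and return the rolls."""
    rng = rng or random.Random()
    match = re.search(r"(\d*)\s*[dD](\d+)", command)
    if match is None:
        return None  # not a dice command this sketch understands
    count = int(match.group(1) or 1)  # a bare 'D20' means one die
    sides = int(match.group(2))
    return [rng.randint(1, sides) for _ in range(count)]
```

Being forgiving with whitespace and case (`3D20`, `3 d 20`) is what lets many phrasings of the same command work.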

2 July 2013

Ondřej Čertík: My impressions from the SciPy 2013 conference

I have attended the SciPy 2013 conference in Austin, Texas. Here are my impressions.

Number one is the fact that the IPython notebook was used by pretty much everyone. I use it a lot myself, but I didn't realize how ubiquitous it has become. It is quickly becoming the standard now. The IPython notebook uses Markdown, and in fact it is better than ReST. The way to remember the "[]()" syntax for links is that in regular text you put links into () parentheses, so you do the same in Markdown, and prepend [] for the text of the link. The other way to remember is that [] feels more serious and thus is used for the text of the link. I stressed several times to +Fernando Perez and +Brian Granger how awesome it would be to have interactive widgets in the notebook. Fortunately that was pretty much preaching to the choir, as that's one of the first things they plan to implement good foundations for, and I just can't wait to use that.

It is now clear, that the IPython notebook is the way to store computations that I want to share with other people, or to use it as a "lab notebook" for myself, so that I can remember what exactly I did to obtain the results (for example how exactly I obtained some figures from raw data). In other words --- instead of having sets of scripts and manual bash commands that have to be executed in particular order to do what I want, just use IPython notebook and put everything in there.

Number two is how big the conference has become since the last time I attended (a couple of years ago), yet it still has the friendly feeling. Unfortunately, I had to miss a lot of talks due to scheduling conflicts (there were three parallel sessions), so I look forward to seeing them on video.

+Aaron Meurer and I gave the SymPy tutorial (see the link for videos and other tutorial materials). It was nice to finally meet +Matthew Rocklin (a very active SymPy contributor) in person. He also had an interesting presentation about symbolic matrices + LAPACK code generation. +Jason Moore presented PyDy.
It's been a great pleasure for us to invite +David Li (still a high school student) to attend the conference and give a presentation about his work on sympygamma.com and live.sympy.org.

It was nice to meet the Julia guys, +Jeff Bezanson and +Stefan Karpinski. I contributed the Fortran benchmarks on Julia's website some time ago, but I had the feeling that a lot of them are quite artificial and not very meaningful. I think Jeff and Stefan confirmed my feeling. Julia seems to have quite an interesting type system and multiple dispatch, which SymPy should learn from.

I met the VTK guys +Matthew McCormick and +Pat Marion. One of the keynotes was given by +Will Schroeder from Kitware, about publishing. I remember him stressing to manage dependencies well, as well as to use a BSD-like license (as opposed to viral licenses like the GPL or LGPL), and that open source has pretty much won (i.e. it is now clear that that is the way to go).

I had great discussions with +Francesc Alted, +Andy Terrel, +Brett Murphy, +Jonathan Rocher, +Eric Jones, +Travis Oliphant, +Mark Wiebe, +Ilan Schnell, +Stéfan van der Walt, +David Cournapeau, +Anthony Scopatz, +Paul Ivanov, +Michael Droettboom, +Wes McKinney, +Jake Vanderplas, +Kurt Smith, +Aron Ahmadia, +Kyle Mandli, +Benjamin Root and others.


It's also been nice to have a chat with +Jason Vertrees and other guys from Schrödinger.

One other thing that I realized last week at the conference is that pretty much everyone agreed that NumPy should act as the default way to represent memory (no matter whether the array was created in Fortran or other code) and allow manipulations on it. Faster libraries like Blaze or ODIN should then hook themselves up into NumPy using multiple dispatch. Also SymPy would then hook itself up so that it can be used with array operations natively. Currently SymPy does work with NumPy (see our tests for some examples of what works), but the solution is a bit fragile (it is not possible to override NumPy behavior, but because NumPy supports general objects, we simply give it SymPy objects and things mostly work).

Similar to this, I would like to create multiple dispatch in SymPy core itself, so that other (faster) libraries for symbolic manipulation can hook themselves up, so that their own (faster) multiplication, expansion or series expansion would get called instead of the SymPy default one implemented in pure Python.
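What that could look like in spirit, sketched in Python (this is my own toy illustration, not SymPy's actual design): a registry keyed on the operation and the types of both operands, so a faster external library can register its own implementations without touching the core.

```python
# A toy two-argument dispatch registry: the implementation is chosen by the
# types of *both* operands, so an external library could register an
# override for its own types without modifying the core.
_registry = {}

def register(op, left_type, right_type):
    def decorator(fn):
        _registry[(op, left_type, right_type)] = fn
        return fn
    return decorator

def dispatch(op, a, b):
    # Walk the method resolution order of both arguments to find the most
    # specific registered implementation.
    for lt in type(a).__mro__:
        for rt in type(b).__mro__:
            fn = _registry.get((op, lt, rt))
            if fn is not None:
                return fn(a, b)
    raise TypeError("no implementation of %s for %r, %r" % (op, type(a), type(b)))

@register("mul", int, int)
def _mul_ints(a, b):
    return a * b
```

Because lookup walks the MRO, a subclass with no registration of its own still falls back to the base-type implementation.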

Other blog posts from the conference:

14 March 2013

Russ Allbery: Review: Higher Order Perl

Review: Higher Order Perl, by Mark Jason Dominus
Publisher: Morgan Kaufmann
Copyright: 2005
ISBN: 1-55860-701-3
Format: Trade paperback
Pages: 563
Disclaimer: The author and I both used to be (technically still are, but the newsgroup is dead) moderators of comp.lang.perl.moderated, and I knew him electronically from answering questions on the Perl Usenet newsgroups many years ago. Perl is, in general construction, a typical imperative language in the style of C, with object-oriented capabilities bolted (somewhat awkwardly) on. Most Perl code looks remarkably like C code with more powerful built-in data types and no memory management. A minority of Perl programs embrace object-oriented techniques and remind one of C++ or Java (or Python). But Perl also has powerful capabilities borrowed from functional programming languages and the Lisp tradition, even though many Perl programmers rarely use them and aren't familiar with them. Higher Order Perl focuses on those capabilities and how to use them effectively, starting with callbacks and uses of function pointers and moving into recursion and analysis of recursive functions, iterators, currying, and parsers. It concludes with a fully-worked example of constructing a declarative programming system using the techniques developed earlier in the book. Higher Order Perl is a programming book for an intermediate to advanced Perl programmer, already a rare topic. This is not a look at how to apply Perl to another area of programming, or a cookbook of techniques. It's an attempt to help a Perl programmer think about and use the language differently, and it follows through on that. It's refreshing and rare to read a programming technique book that's targeted at the practicing expert. (I think the size of the audience is often too small for publishers to target it.) Even the most experienced Perl programmer is probably going to learn something fundamental from this book, not just interesting trivia around the edges. 
On the negative side, though, I found the book a bit too focused on computer science and mathematics problems, particularly in the choice of examples and sample scripts. (And I say this as someone who has a master's degree in computer science with a software theory focus.) This is a bit hard to avoid for topics like recursion, where problems like computing Fibonacci numbers are classic, but throughout the book I struggled to focus past feelings of "but why would I ever do that?" The extended discussion of the Newton-Raphson method is the most memorable; I'm not sure that's a problem many Perl programmers would have and need a higher-order technique for. There's also a lot of discussion of recursion analysis and transformations between recursive and iterative expressions of problems, which is ground I remember well from my degree but which I've rarely had any practical use for in day-to-day programming. This is not a uniform problem, though, just a tendency. There are some great examples that I think are more in the mainstream of Perl problems, including a reinvention of File::Find that shows how to add more flexibility, a web spider, and a great discussion on how to construct a conventional queryable database on top of a flat file (a topic that's near and dear to my heart). And then there's the chapter on infinite streams. Dominus presents a method of using closures to create a version of an iterator: a sort of infinite linked list that can keep generating additional elements. He presents this in several contexts, but one of them is log parsing, and that turned out to be exactly the solution that I needed for a problem I was working on while reading this book. I've written about this elsewhere, but this was a wonderful idea that helped me think about both Perl and a major application area in a completely new way, and I wrote an application using that knowledge that would have taken me much longer using different techniques and would have been much less fun.
So, for me, this chapter was more than worth the entire book, and blankets the rest of it in a delighted feeling. Other people may or may not have that experience. I think it will depend on whether one of these techniques hits home for you the way that one did for me. This is a book with some idiosyncrasies, and some sections that may drag. Having lots of fully-worked examples is a major plus, but some of those are so comprehensive that one can get a bit lost in the details. That particularly hit me with the last couple of chapters on parsing and on the example declarative programming application. Quite a bit of that text involved reinventing a recursive descent parser (another very computer science example), when I'm not quite sure why one wouldn't use one of the existing parser generators on CPAN for practical purposes, and inventing a lot of new syntax to try to make the parser's Perl code more readable. But one certainly can't complain that Dominus omits necessary details, and there is some appeal in watching an experienced programmer work through a problem from analysis to implementation. But, despite the idiosyncrasies, I recommend this book to any experienced Perl programmer who wants to expand their view of the language. The techniques here (closures, higher-order functions, iterators and streams, and formal parsers) are powerful and underused in the Perl community. You may want to pick and choose which sections you pay close attention to, but I think everyone will find something of interest here. In the weeks since I read this book, my opinion of it has only grown. And I can't tell you just how much I loved the infinite stream concept. Rating: 8 out of 10
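The book's Perl code can't be reproduced here, but the core trick of the streams chapter translates to a few lines in any language with closures. A sketch in Python (my own illustration, not Dominus's code): a stream is a head value paired with a thunk that builds the rest on demand.

```python
def node(head_value, tail_thunk):
    """A stream node: a value plus a zero-argument function for the rest."""
    return (head_value, tail_thunk)

def head(stream):
    return stream[0]

def tail(stream):
    return stream[1]()  # force the thunk to build the next node

def take(stream, n):
    """Consume the first n elements of a (possibly infinite) stream."""
    out = []
    for _ in range(n):
        out.append(head(stream))
        stream = tail(stream)
    return out

def stream_map(fn, stream):
    """Lazily apply fn to every element; nothing runs until forced."""
    return node(fn(head(stream)), lambda: stream_map(fn, tail(stream)))

def integers_from(n):
    return node(n, lambda: integers_from(n + 1))
```

Nothing past the current head exists until `tail` forces the thunk, which is what lets the "infinite linked list" stay finite in memory.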

1 February 2013

James Bromberger: LCA 2013

LCA Past Organisers

Previous core organisers of Linux.conf.au, taken at Mt Stromlo Observatory during LCA 2013 (pic by Josh Stewart); except one of these people organised CALU, and another hasn't organised one at all!

Thanks to all the people at LCA2013 in Canberra; it was a blast! So good to see old friends and chat freely about what's hot and happening. Radia (known for STP, TRILL), Sir Tim (the web) and old friend Bdale (Debian, SPI, Freedom Box) were inspiring. As was Robert Llewellyn (Kryten, Red Dwarf), who was a complete pleasure; he wandered back and talked for a while with the volunteer video crew. Hats off to Pia for organising the TBL tour, to Mary Gardner for being awarded the Rusty Wrench, and to the team from PLUG (Euan, Jason, Leon, Luke) who stepped up to help with the video team, and to Paul who graciously accepted the help. Next up: LCA2014 Perth! Y'all come back now... it's been a decade.

23 January 2013

Russ Allbery: Log parsing and infinite streams

I have a problem I have to solve for work that involves correlating Apache access and error logs. Part of WebAuth logs successful authentications to the Apache error log, and I want to correlate User-Agent strings with users so that we can figure out what devices our users are using by percentage of users rather than percentage of hits. The problem, as those who have tried to do this prior to Apache 2.4 know, is that Apache doesn't provide any easy way to correlate access and error log entries (made even more complex because two separate components are involved). I could have just hacked something together, but I've written way too many ad hoc log parsers, and, see, I was reading this book.... The book in question is Mark Jason Dominus's Higher Order Perl. I'm not quite done with it, and will post a full review when I am. I have some problems with it, mostly around the author's choice of example problems. But there is one chapter on infinite streams, and the moment I read that chapter on the train, it struck me as the perfect solution to log parsing problems. I'm not much of a functional programming person (which is where Dominus is drawing most of the material for this book), so I don't know if this terminology is standard or somewhat unique to the book. An infinite stream in this context is basically a variation on an iterator that lets you look at the next item without consuming it. The power comes from putting this modified iterator in front of a generator, using it to consume one element at a time, and then composing this with transformation and filtering functions. That gives you all the power of the iterator to store state and lets you process one logical element from the stream at a time, without inverting your flow of control. Dominus provides code in the book for a very nice functional implementation of this that's about as close as you're probably going to get to writing Haskell in Perl.
Unfortunately, his publisher decided to write their own free software license, so the license is kind of weird and doesn't explicitly permit redistribution of modified code. It's probably fine, but I didn't feel like dealing with it, and I'm more comfortable with writing object-oriented code at the moment (at least in Perl), so I decided to write an object-oriented version of the same code specific for log parsing. That's what I've been doing since shortly after lunch, and I can't remember the last time I've had this much fun writing code. I have a reasonable first cut at a (fully tested and fully test-driven) log parsing framework built on top of a reasonably generic implementation of the core ideas behind infinite streams. I also used this as an opportunity to experiment with Module::Build, and have discovered that the things I most disliked about it have apparently all been fixed. And I'm also using Perl 5.10 features. (I was tempted to start using some things from 5.12, but I do actually need to run this on Debian stable.) It's rather satisfying to write a thoroughly modern Perl module. There are some definite drawbacks to writing this in an object-oriented fashion. There's rather more machinery and glue that has to be set up, it's probably a bit slower, and it tends to accumulate layers of calls. One of the advantages of the method with standalone functions and a very simple, transparent data structure is that it's easier to eliminate unnecessary call nesting. But I suspect the object-oriented version will do what I want without any difficulties, and if I feel very inspired, I can always fiddle with it later. Maybe I'll eventually use this as a project to experiment with Moose as well. I'm surprised that no one else has done this, but I poked around on CPAN a fair bit and couldn't find anything. This will all show up on CPAN (as Log::Stream) as soon as I've finished enough of it to implement my sample application. 
And then I'll hopefully find some time to rewrite our metrics system using it, which should simplify it considerably....
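The Log::Stream code itself isn't shown in the post, but the shape of the idea (an iterator with a non-consuming peek, composed with transformation and filter methods) can be sketched in Python; the class and method names below are my own illustration, not the actual CPAN interface.

```python
class Stream:
    """An iterator wrapper supporting a non-consuming peek at the next element.

    None is used as the end-of-stream sentinel, so elements must not be None.
    """
    def __init__(self, source):
        self._it = iter(source)
        self._peeked = None
        self._has_peeked = False

    def peek(self):
        # Look at the next element without consuming it.
        if not self._has_peeked:
            self._peeked = next(self._it, None)
            self._has_peeked = True
        return self._peeked

    def get(self):
        # Consume and return the next element (None at end of stream).
        value = self.peek()
        self._has_peeked = False
        return value

    def filter(self, predicate):
        return Stream(x for x in self._drain() if predicate(x))

    def map(self, fn):
        return Stream(fn(x) for x in self._drain())

    def _drain(self):
        # Lazily yield the remaining elements for composition.
        while self.peek() is not None:
            yield self.get()
```

For log parsing, something like `Stream(log_lines).filter(lambda l: "WebAuth" in l)` would give a lazily filtered stream of the interesting lines, with `peek` available for lookahead correlation.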

26 November 2012

Russell Coker: Links November 2012

Julian Treasure gave an informative TED talk about The 4 Ways Sound Affects Us [1]. Among other things he claims that open plan offices reduce productivity by 66%! He suggests that people who work in such offices wear headphones and play bird-songs. Naked Capitalism has an interesting interview between John Cusack and Jonathan Turley about how the US government policy of killing US citizens without trial demonstrates the failure of their political system [2]. Washington's Blog has an interesting article on the economy in Iceland [3]. Allowing the insolvent banks to go bankrupt was the best thing that they have ever done for their economy. Clay Shirky wrote an insightful article about the social environment of mailing lists and ways to limit flame-wars [4]. ZRep is an interesting program that mirrors ZFS filesystems via regular snapshots and send/recv operations [5]. It seems that it could offer similar benefits to DRBD but at the file level and with greater reliability. James Lockyer gave a moving TEDx talk about his work in providing a legal defence for the wrongly convicted [6]. This has included overturning convictions after as much as half a century, in which time the falsely accused had already served a life sentence. Nathan Myers wrote an epic polemic about US government policy since 9-11 [7]. It's good to see that some Americans realise it's wrong. There is an insightful TED blog post about TED Fellow Salvatore Iaconesi who has brain cancer [8]. Apparently he had some problems with medical records in proprietary formats which made it difficult to get experts to properly assess his condition. Open document standards can be a matter of life and death and should be mandated by federal law. Paul Wayper wrote an interesting and amusing post about Emotional Computing which compares the strategies of Apple, MS, and the FOSS community among other things [9]. Kevin Allocca of Youtube gave an insightful TED talk about why videos go viral [10].
Jason Fried gave an interesting TED talk, "Why Work Doesn't Happen at Work" [11]. His main issues are distraction and wasted time in meetings. He gives some good ideas for how to improve productivity. But they can also be used for sabotage. If someone doesn't like their employer then they could call for meetings, incite managers to call meetings, and book meetings so that they don't follow each other and thus waste more of the day (EG meetings at 1PM and 3PM instead of having the second meeting when the first finishes). Shyam Sankar gave an interesting TED talk about human-computer cooperation [12]. He describes the success of human-computer partnerships in winning chess tournaments, protein folding, and other computational challenges. It seems that the limit for many types of computation will be the ability to get people and computers to work together efficiently. Cory Doctorow wrote an interesting and amusing article for Locus Magazine about some of the failings of modern sci-fi movies [13]. He is mainly concerned with pointless movies that get the science and technology aspects wrong and the way that the blockbuster budget process drives the development of such movies. Of course there are many other things wrong with sci-fi movies, such as the fact that most of them are totally implausible (EG aliens who look like humans). The TED blog has an interesting interview with Catarina Mota about hacker spaces and open hardware [14]. Sociological Images has an interesting article about sporting behaviour [15]. They link to a very funny youtube video of a US high school football team who make the other team believe that they aren't playing until they win [16] Related posts:
  1. Links April 2012 Karen Tse gave an interesting TED talk about how to...
  2. Links March 2012 Washington s Blog has an informative summary of recent articles about...
  3. Links November 2011 Forbes has an interesting article about crowd-sourcing by criminals and...

11 September 2012

Jon Dowland: Archiving

Last year, via jwz, I watched the video "Archive Team: A Distributed Preservation of Service Attack" from Defcon 19. I learned about the Archive Team and the work of Jason Scott. More recently I learned about the archive.org Shareware CD Archive. In the fledgling days of the Internet, shareware (and cover-mount) CD-ROMs were a popular way for files to be distributed. They therefore capture an interesting age in the history of modern Internet culture. Inspired by the above, I dug out some of my old shareware Doom CDs, ripped them, scanned their covers (where I had them) and uploaded them to archive.org. Here they are: They are all part of the growing Doom Level CD Collection. In most cases, the CDs are a superset of files that exist in the /idgames archive. I'm fairly sure there is some stuff on these CDs that never made it out of BBSes or the AOL and CompuServe walled gardens onto the wider Internet, with the exception of these shovelware collections. A follow-on project would be to cross-reference their indexes with the /idgames archive and upload what's missing (where that can be done legally). Finally, I also had a single, solitary PC ZONE covermount CD that I held on to because it was a Quake (and Duke Nukem 3D) add-on special. From an archive perspective, Quake has not fared as well as Doom did. The Internet was young when Doom was popular, the World Wide Web was not the all-encompassing thing it has become, and, by accident rather than design, nearly all Doom add-ons ended up being uploaded to a single FTP server: the Walnut Creek CDROM FTP server, in a sub-folder /idgames. This single archive was mirrored widely and has lived on past the death of Walnut Creek. Today, it is small enough for casual enthusiasts to mirror, and has been kept alive by volunteer admins. The most popular front-end is now http://www.doomworld.com/idgames/. The WWW had grown up by the time Quake came along. There is an /idgames2, but it was never as popular as /idgames was for Doom.
The Quake modding community was centered around a series of commercial websites such as Planet Quake, later part of the Gamespy network. Sadly the vast majority of web pages on the old Planet Quake site and similar sites have died completely from bit-rot. Large chunks of the history of the Quake community are therefore lost to a sort-of technological dark age.
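The cross-referencing step mentioned above is essentially a set difference over two file listings. A minimal sketch (the function and the sample paths are hypothetical placeholders, not the real archive layout):

```python
def files_missing_from_archive(cd_index, idgames_index):
    """Return the CD files whose names don't appear in the /idgames listing.

    Both arguments are iterables of file names; matching is done on the
    lowercased base name, since DOS-era archives differ in case and paths.
    """
    known = {name.rsplit("/", 1)[-1].lower() for name in idgames_index}
    return sorted(
        name for name in cd_index
        if name.rsplit("/", 1)[-1].lower() not in known
    )
```

Matching on base name alone would produce false negatives for distinct WADs that share a name, so a real pass would probably also compare file sizes or checksums.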
